Information Bottleneck Theory on Convolutional Neural Networks

Authors

Abstract

In recent years, many studies have attempted to open the black box of deep neural networks, proposing various theories to understand it. Among them, information bottleneck (IB) theory claims that training consists of two distinct phases: a fitting phase followed by a compression phase. This claim has attracted attention owing to its success in explaining the inner behavior of feedforward networks. In this paper, we apply IB dynamics to convolutional neural networks (CNNs) to investigate how fundamental design choices such as layer width, kernel size, network depth, pooling layers, and multiple fully connected layers affect the performance of CNNs. In particular, through a series of experimental analyses on the benchmark datasets MNIST and Fashion-MNIST, we demonstrate that the compression phase is not observed in all of these cases. This shows that the behavior of CNNs is rather more complicated than that of feedforward networks.
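
The IB analysis referred to here tracks, for each layer representation T, the mutual information I(X;T) with the input and I(T;Y) with the labels over training; the compression phase, when it occurs, shows up as a late-training decrease in I(X;T). Below is a minimal sketch of the binning-based estimator commonly used to plot these information-plane trajectories; the bin count, function names, and the deterministic-layer shortcut for I(X;T) are illustrative assumptions, not code from the paper.

```python
import numpy as np

def discretize(activations, n_bins=30):
    """Map continuous activations (samples x units) to integer bin indices."""
    edges = np.linspace(activations.min(), activations.max(), n_bins + 1)
    return np.digitize(activations, edges)

def entropy(ids):
    """Empirical Shannon entropy (bits) of a vector of discrete symbols."""
    _, counts = np.unique(ids, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_plane_point(activations, labels, n_bins=30):
    """Return (I(X;T), I(T;Y)) estimates for one layer on one batch."""
    t = discretize(activations, n_bins)
    # Treat each discretized activation pattern as one symbol of T.
    t_ids = np.array([hash(row.tobytes()) for row in t])
    h_t = entropy(t_ids)
    # With distinct inputs and a deterministic layer, I(X;T) = H(T).
    i_xt = h_t
    # I(T;Y) = H(T) - H(T|Y), averaging the entropy of T within each class.
    h_t_given_y = sum(
        (labels == y).mean() * entropy(t_ids[labels == y])
        for y in np.unique(labels)
    )
    return i_xt, h_t - h_t_given_y
```

Recording one such point per layer per epoch and plotting I(T;Y) against I(X;T) yields the information-plane trajectories on which the fitting/compression distinction is based.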


Similar Resources

An Information-Theoretic Discussion of Convolutional Bottleneck Features for Robust Speech Recognition

Convolutional Neural Networks (CNNs) have demonstrated strong performance in speech recognition systems, both for feature extraction and for acoustic modeling. In addition, CNNs have been applied to robust speech recognition, and competitive results have been reported. The Convolutive Bottleneck Network (CBN) is a kind of CNN that has a bottleneck layer among its fully connected layers. The bottleneck fea...
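
As a concrete illustration of the architecture described here, the following sketch places a narrow bottleneck layer among the fully connected layers of a small CNN and exposes its activations as extractable features; the input shape, layer sizes, bottleneck width, and PyTorch realization are illustrative assumptions rather than the cited paper's configuration.

```python
import torch
import torch.nn as nn

class CBN(nn.Module):
    """Sketch of a Convolutive Bottleneck Network for 1x28x28 inputs."""

    def __init__(self, n_classes=10, bottleneck_dim=40):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.fc_in = nn.Sequential(
            nn.Flatten(), nn.Linear(64 * 7 * 7, 512), nn.ReLU()
        )
        # The narrow layer among the fully connected layers: the bottleneck.
        self.bottleneck = nn.Linear(512, bottleneck_dim)
        self.fc_out = nn.Sequential(nn.ReLU(), nn.Linear(bottleneck_dim, n_classes))

    def forward(self, x, return_features=False):
        z = self.bottleneck(self.fc_in(self.conv(x)))
        if return_features:
            return z  # bottleneck activations serve as compact features
        return self.fc_out(z)
```

After training the whole network on the classification task, calling the model with `return_features=True` yields the low-dimensional bottleneck features that this line of work feeds to a downstream recognizer.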


Incorporating Prototype Theory in Convolutional Neural Networks

Deep artificial neural networks have made remarkable progress on different tasks in the field of computer vision. However, the empirical analysis of these models and the investigation of their failure cases have only recently received attention. In this work, we show that deep learning models cannot generalize to atypical images that are substantially different from training images. This is in contrast t...


Notes on Convolutional Neural Networks

We discuss the derivation and implementation of convolutional neural networks, followed by an extension that allows one to learn sparse combinations of feature maps. The derivation we present is specific to two-dimensional data and convolutions, but can be extended without much additional effort to an arbitrary number of dimensions. Throughout the discussion, we emphasize the efficiency of the imp...
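
One plausible reading of the sparse-combination extension is sketched below: each output feature map is a learned weighted sum of the input maps, and an L1 penalty on the combination weights encourages each output map to draw on only a few inputs. The parameterization and penalty form are assumptions for illustration, not the derivation given in the notes.

```python
import torch
import torch.nn as nn

class SparseMapCombination(nn.Module):
    """Learn sparse weights alpha[j, i] combining input map i into output map j."""

    def __init__(self, in_maps, out_maps):
        super().__init__()
        self.alpha = nn.Parameter(torch.randn(out_maps, in_maps) * 0.01)

    def forward(self, x):
        # x: (batch, in_maps, H, W) -> (batch, out_maps, H, W)
        return torch.einsum('oi,bihw->bohw', self.alpha, x)

    def l1_penalty(self):
        # Added to the task loss to push combination weights toward sparsity.
        return self.alpha.abs().sum()
```

During training, one would add `lam * layer.l1_penalty()` to the task loss (for some hypothetical weight `lam`) so that each output map learns to depend on only a handful of input maps.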


Cystoscopy Image Classification Using Deep Convolutional Neural Networks

In the past three decades, the use of smart methods in medical diagnostic systems has attracted the attention of many researchers. However, despite the high worldwide prevalence of bladder cancer, no smart method has been provided in the field of medical image processing for diagnosing it through cystoscopy images. In this paper, two well-known convolutional neural networks (CNNs) ...


Layer-wise Learning of Stochastic Neural Networks with Information Bottleneck

In this paper, we present a layer-wise learning method for stochastic neural networks (SNNs) from an information-theoretic perspective. In each layer of an SNN, the compression and the relevance are defined to quantify the amount of information that the layer contains about the input space and the target space, respectively. We jointly optimize the compression and the relevance of all parameters in an SN...
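
The sketch below shows one standard way such a per-layer objective can be written with variational bounds: relevance (information about the target) is lower-bounded by a decoder's negative cross-entropy, and compression (information about the input) is upper-bounded by a KL divergence to a fixed prior. The Gaussian parameterization and the trade-off weight beta are illustrative assumptions, not the cited paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def ib_layer_loss(mu, log_var, logits, targets, beta=1e-3):
    """Per-layer IB loss: -relevance + beta * compression.

    mu, log_var : parameters of the stochastic layer q(t|x) = N(mu, diag(exp(log_var)))
    logits      : decoder prediction of the target from a sample t
    targets     : class labels
    """
    # Relevance I(T;Y) lower bound: negative cross-entropy of the decoder.
    relevance = -F.cross_entropy(logits, targets)
    # Compression I(X;T) upper bound: KL(q(t|x) || N(0, I)), averaged over the batch.
    kl = 0.5 * (mu.pow(2) + log_var.exp() - log_var - 1).sum(dim=1).mean()
    return -relevance + beta * kl
```

Summing this loss over layers and minimizing it trades each layer's predictive information against the amount of input information it retains, which is the joint compression/relevance optimization the abstract describes.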



Journal

Journal Title: Neural Processing Letters

Year: 2021

ISSN: 1573-773X, 1370-4621

DOI: https://doi.org/10.1007/s11063-021-10445-6